A Cascade Network Algorithm Employing Progressive RPROP
Authors

N.K. Treadgold and T.D. Gedeon
School of Computer Science & Engineering, The University of New South Wales, Sydney N.S.W. 2052, AUSTRALIA
{ nickt | tom }@cse.unsw.edu.au

Abstract

Cascade Correlation (Cascor) has proved to be a powerful method for training neural networks. Cascor, however, has been shown not to generalise well on regression and some classification problems. A new Cascade network algorithm employing Progressive RPROP (Casper) is proposed. Casper, like Cascor, is a constructive learning algorithm which builds cascade networks. Instead of using weight freezing and a correlation measure to install new neurons, however, Casper uses a variation of RPROP to train the whole network. Casper is shown to produce more compact networks, which generalise better than Cascor.
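The abstract's key ingredient is RPROP, which adapts a separate step size per weight from the *sign* of the gradient rather than its magnitude. As a rough illustration (not the authors' Casper implementation, and with hyperparameter values chosen here for the sketch), a single RPROP update can be written as:

```python
import numpy as np

def rprop_step(w, grad, prev_grad, delta,
               eta_plus=1.2, eta_minus=0.5,
               delta_max=50.0, delta_min=1e-6):
    """One RPROP update: per-weight step sizes grow while the
    gradient sign is stable and shrink when it flips (sketch)."""
    sign_change = grad * prev_grad
    # Same sign as last step: direction is stable, grow the step size.
    delta = np.where(sign_change > 0,
                     np.minimum(delta * eta_plus, delta_max), delta)
    # Sign flip: the last step overshot a minimum, shrink the step size.
    delta = np.where(sign_change < 0,
                     np.maximum(delta * eta_minus, delta_min), delta)
    # Move each weight against its gradient sign by its own step size.
    w = w - np.sign(grad) * delta
    # After a sign flip, zero the stored gradient so the next
    # iteration is treated as a fresh direction.
    prev_grad = np.where(sign_change < 0, 0.0, grad)
    return w, prev_grad, delta
```

Casper's Progressive RPROP variant additionally assigns different initial step-size regimes to the newly installed neuron and the older parts of the network, so the whole network keeps training without weight freezing.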
Similar Resources
Extending CasPer: A Regression Survey
The CasPer algorithm is a constructive neural network algorithm. CasPer creates cascade network architectures in a similar manner to Cascade Correlation. CasPer, however, uses a modified form of the RPROP algorithm, termed Progressive RPROP, to train the whole network after the addition of each new hidden neuron. Previous work with CasPer has shown that it builds networks which generalise bette...
Extending and Benchmarking the CasPer Algorithm
The CasPer algorithm is a constructive neural network algorithm. CasPer creates cascade network architectures in a similar manner to Cascade Correlation. CasPer, however, uses a modified form of the RPROP algorithm, termed Progressive RPROP, to train the whole network after the addition of each new hidden neuron. Previous work with CasPer has shown that it builds networks which generalise bette...
Vision-Based Low-Level Navigation using a Feed-Forward Neural Network
In this paper we propose a simple method for low-level navigation for autonomous mobile robots, employing an artificial neural network. Both corridor following and obstacle avoidance in indoor environments are managed by the same network. Raw grayscale images of size 32 × 23 pixels are processed one at a time by a feed-forward neural network. The output signals from the network directly control...
Neural Network Architectures and Learning
Abstract: Various learning methods for neural networks, both supervised and unsupervised, are presented and illustrated with examples. A general learning rule, expressed as a function of the incoming signals, is discussed. Other learning rules such as Hebbian learning, perceptron learning, LMS (Least Mean Square) learning, delta learning, WTA (Winner Take All) learning, and PCA (Principal Component Analy...
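Of the rules the snippet above lists, the delta (LMS) rule is the simplest to state concretely: for a linear unit, each weight moves along its input, scaled by the output error. A minimal sketch (illustrative only, with a learning rate chosen here for the example):

```python
import numpy as np

def lms_update(w, x, target, lr=0.1):
    """One delta-rule (LMS) update for a single linear unit."""
    y = np.dot(w, x)            # linear unit output
    error = target - y          # delta term: desired minus actual
    return w + lr * error * x   # gradient step on the squared error
```

Repeated over a training set, this performs stochastic gradient descent on the mean squared output error.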
An Efficient Improvement of the Rprop Algorithm
This paper introduces an efficient modification of the Rprop algorithm for training neural networks. The convergence of the new algorithm can be justified theoretically, and its performance is investigated empirically through simulation experiments using some pattern classification benchmarks. Numerical evidence shows that the algorithm exhibits improved learning speed in all cases, and compare...